Shell exercise: log summarization

I have a log (PHP's slow_log) that gets new entries almost every minute. I need a script to analyze it: group the entries, count how many times each one occurs, and sort them by frequency. Assuming the log is cleared at 00:05 every day, the script should produce a summary once per hour, and then at 00:00 the next day produce a summary for the whole previous day. How would you write this? Below is a log sample (followed by a quick one-liner that illustrates the grouping idea) and the analysis script I wrote, for reference. Log sample:

[24-Oct-2013 00:05:39] [pool www.lishiming.net] pid 19101
script_filename = /data/release/www.lishiming.net/forum.php
[0x00007f9279237e98] mysql_unbuffered_query() /data/release/www.lishiming.net/source/class/db/db_driver_mysql.php:147
[0x00007f92792377a8] query() /data/release/www.lishiming.net/source/class/discuz/discuz_database.php:136
[0x00007f9279236f40] query() /data/release/www.lishiming.net/source/class/table/table_forum_thread.php:932
[0x00007f92792365e8] increase() /data/release/www.lishiming.net/source/module/forum/forum_viewthread.php:1034
[0x00007f9279218f48] viewthread_updateviews() /data/release/www.lishiming.net/source/module/forum/forum_viewthread.php:353
[0x00007f9279218050] +++ dump failed
[24-Oct-2013 00:05:39] [pool www.lishiming.net] pid 19754
script_filename = /data/release/www.lishiming.net/forum.php
[0x00007f9279237938] mysql_query() /data/release/www.lishiming.net/source/class/db/db_driver_mysql.php:147
[0x00007f9279237248] query() /data/release/www.lishiming.net/source/class/discuz/discuz_database.php:136
[0x00007f9279236dc0] query() /data/release/www.lishiming.net/source/class/discuz/discuz_database.php:100
[0x00007f9279235d48] fetch_all() /data/release/www.lishiming.net/source/class/table/table_forum_thread.php:523
[0x00007f9279218f48] fetch_all_search() /data/release/www.lishiming.net/source/module/forum/forum_forumdisplay.php:637
[0x00007f9279218050] +++ dump failed
[24-Oct-2013 00:06:07] [pool www.lishiming.net] pid 22624
script_filename = /data/release/www.lishiming.net/forum.php
[0x00007f9279237938] mysql_query() /data/release/www.lishiming.net/source/class/db/db_driver_mysql.php:147
[0x00007f9279237248] query() /data/release/www.lishiming.net/source/class/discuz/discuz_database.php:136
[0x00007f9279236dc0] query() /data/release/www.lishiming.net/source/class/discuz/discuz_database.php:100
[0x00007f9279235d48] fetch_all() /data/release/www.lishiming.net/source/class/table/table_forum_thread.php:523
[0x00007f9279218f48] fetch_all_search() /data/release/www.lishiming.net/source/module/forum/forum_forumdisplay.php:637
[0x00007f9279218050] +++ dump failed
[24-Oct-2013 00:06:18] [pool www.lishiming.net] pid 22624
script_filename = /data/release/www.lishiming.net/forum.php
[0x00007f9279237938] mysql_query() /data/release/www.lishiming.net/source/class/db/db_driver_mysql.php:147
[0x00007f9279237248] query() /data/release/www.lishiming.net/source/class/discuz/discuz_database.php:136
[0x00007f9279236dc0] query() /data/release/www.lishiming.net/source/class/discuz/discuz_database.php:100
[0x00007f9279235d48] fetch_all() /data/release/www.lishiming.net/source/class/table/table_forum_thread.php:523
[0x00007f9279218f48] fetch_all_search() /data/release/www.lishiming.net/source/module/forum/forum_forumdisplay.php:637
[0x00007f9279218050] +++ dump failed
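
Every entry starts with a "[pool ...]" header line and carries a script_filename line, so the heart of the summary is just counting how often each script (or script plus call stack) appears. As a quick cross-check against the sample above, here is a minimal one-liner that counts entries per script_filename only (the full script below also keeps the stripped call stack as part of the grouping key):

awk '/^script_filename = / {cnt[$3]++} END {for (s in cnt) print cnt[s], s}' /usr/local/php/log/php.slow | sort -rn

For the four sample entries above this prints "4 /data/release/www.lishiming.net/forum.php".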

Script:

#!/bin/bash
# Hourly summary of the PHP slow log: group the previous hour's entries,
# count them and sort by frequency; the 00:00 run also builds yesterday's daily summary.
slow_log=/usr/local/php/log/php.slow
d_h=`date +%H`
d_m=`date +%M`
d_d=`date +%Y%m%d`                 # today, e.g. 20131024
d_d2=`date -d "-1 day" +%Y%m%d`    # yesterday
logdir="/log/php_slow/$d_d"
logdir2="/log/php_slow/$d_d2"
[ -d "$logdir" ] || mkdir -p "$logdir"
[ -d "$logdir2" ] || mkdir -p "$logdir2"
# Only do anything at minute 00 of each hour.
if [ "$d_m" = "00" ]; then
    d1=`date -d "-1 hour" +%H`
    # Line number of the first entry of the previous hour, and total line count.
    n1=`grep -n " $d1:[0-9][0-9]:" "$slow_log" | head -n1 | awk -F':' '{print $1}'`
    [ -z "$n1" ] && exit 0         # nothing was logged during that hour
    n2=`wc -l < "$slow_log"`
    n3=$((n2 - n1 + 1))            # number of lines belonging to that hour
    tail -n "$n3" "$slow_log" > /tmp/1.txt
    # Drop the [0x...] addresses and join everything into one line, so the
    # individual entries can be split back out on the "+++ dump failed" marker.
    sed 's/\[0x.*\]//g' /tmp/1.txt | xargs > /tmp/2.txt
    n=`grep -c '\[pool' /tmp/1.txt`
    for i in `seq 1 $n`; do
        awk -F '+++ dump failed' '{print $'"$i"'}' /tmp/2.txt
    done > /tmp/3.txt
    if [ "$d_h" != "00" ]; then
        # Normal hours: the hourly summary belongs to today's directory.
        sed 's/^.*script_filename = //' /tmp/3.txt | grep -v '^$' | sort | uniq -c | sort -rn > $logdir/${d1}_slow_log
    else
        # At 00:00 the "previous hour" (23) still belongs to yesterday.
        sed 's/^.*script_filename = //' /tmp/3.txt | grep -v '^$' | sort | uniq -c | sort -rn > $logdir2/${d1}_slow_log
        # Daily summary: add up the counts from the 24 hourly files and re-sort.
        awk '{c = $1; sub(/^ *[0-9]+ /, ""); cnt[$0] += c} END {for (s in cnt) printf "%7d %s\n", cnt[s], s}' $logdir2/[0-9][0-9]_slow_log | sort -rn > $logdir2/${d_d2}_slow_log
    fi
fi
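
Since the script checks for minute 00 itself, it only needs to be started once an hour from cron; the 00:00 run also produces the full-day summary for the previous day, so no separate daily job is required. A possible crontab entry (assuming the script is saved as /usr/local/sbin/php_slow_summary.sh and made executable; that path is only an example):

# root's crontab (crontab -e)
0 * * * * /bin/bash /usr/local/sbin/php_slow_summary.sh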